Wasm build #93

Merged
merged 10 commits into from
Apr 4, 2021

Conversation

dnlup
Contributor

@dnlup dnlup commented Mar 8, 2021

👋🏻 Hi there.

This PR is based on the work started by @devsnek in https://github.com/devsnek/llhttp/tree/wasm.

A little background

We needed a wasm build of llhttp to implement a custom parser in nodejs/undici#575.


The objective of these changes is to expose the required pieces in api.(h|c) for implementing a parser on top of a WebAssembly build of llhttp, and to add the logic for building the binary. We also needed to run the build in a Docker container (nodejs/undici#22 (comment)).

Given how tightly these changes are coupled to the pieces used to build the C parser, we thought it was worth opening a PR to discuss integrating them upstream.

@dnlup dnlup mentioned this pull request Mar 8, 2021
@dnlup
Contributor Author

dnlup commented Mar 8, 2021

cc @mcollina @ronag

Member

@ronag ronag left a comment

I don't think the package-lock.json should be here

Add wasm build using a docker container. Expose some API needed only in wasm.
@devsnek
Member

devsnek commented Mar 8, 2021

I think llhttp should abstract away the wasm instantiation, callback tracking, reading strings, etc. Maybe something like:

const llhttp = require('llhttp/wasm');

const parser = new llhttp.Parser(llhttp.RESPONSE, {
  onX() {},
  onY() {},
});

parser.execute(bytes); // might throw instead of returning an errno

You could also export llhttp.wasmModule so people can create their own instances.
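A rough sketch of what such a wrapper might look like, built on top of an exported llhttp.wasmModule. The export names used here (llhttp_alloc, llhttp_execute) are hypothetical stand-ins, not the real wasm interface, and the stub in the usage below replaces an actual wasm instance:

```javascript
// Sketch of the wrapper idea: llhttp owns callback tracking and
// errno-to-exception conversion; the caller only supplies handlers.
// `llhttp_alloc` / `llhttp_execute` are HYPOTHETICAL export names.
class Parser {
  constructor(wasmExports, type, callbacks) {
    this.exports = wasmExports;
    this.callbacks = callbacks;                // user handlers, e.g. onX/onY
    this.ptr = wasmExports.llhttp_alloc(type); // wasm-side parser state
  }

  execute(bytes) {
    const errno = this.exports.llhttp_execute(this.ptr, bytes);
    if (errno !== 0) {
      // Throw instead of returning an errno, as suggested above.
      throw new Error(`llhttp error ${errno}`);
    }
  }
}
```

With llhttp.wasmModule also exported, consumers wanting different semantics could bypass this class and instantiate the module themselves.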

@ronag
Member

ronag commented Mar 8, 2021

Creating our own instance is what we've done over at nodejs/undici#575, where we have on purpose implemented only RESPONSE. Would it be sufficient to just expose llhttp.wasmModule for now?

@devsnek
Member

devsnek commented Mar 8, 2021

Hmm, I was thinking the logic would still be implemented by the user, like they'd put an llhttp.Parser inside their HTTPParser (like how Node uses llhttp_t inside its HTTPParser). The interface on the llhttp side would just be responsible for converting errors and strings and keeping the callbacks consistent. I'm not maintaining this though, so you do you.

Also, a side note: there's a useful crossover between versions of Node that expose the internal http parser and versions of Node that support FinalizationRegistry, so I think you could hardcode gc on this side.
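The FinalizationRegistry idea could look roughly like this. A minimal sketch assuming hypothetical wasm exports llhttp_alloc/llhttp_free (the real export names may differ):

```javascript
// Sketch: tie the lifetime of the wasm-side parser state to the JS
// wrapper via FinalizationRegistry (Node >= 14.6), so callers don't
// have to free it manually. Export names are hypothetical.
function makeParserClass(wasmExports) {
  const registry = new FinalizationRegistry((ptr) => {
    // Runs some time after a wrapper instance becomes unreachable.
    wasmExports.llhttp_free(ptr);
  });

  return class Parser {
    constructor(type) {
      this.ptr = wasmExports.llhttp_alloc(type);
      registry.register(this, this.ptr);
    }
  };
}
```

Finalizers are not guaranteed to run promptly (or at all on process exit), so an explicit destroy() method is still worth keeping alongside this.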

@mcollina mcollina requested a review from indutny March 8, 2021 15:07
@dnlup
Contributor Author

dnlup commented Mar 8, 2021

Hmm, I was thinking the logic would still be implemented by the user, like they'd put an llhttp.Parser inside their HTTPParser (like how Node uses llhttp_t inside its HTTPParser). The interface on the llhttp side would just be responsible for converting errors and strings and keeping the callbacks consistent. I'm not maintaining this though, so you do you.

Also, a side note: there's a useful crossover between versions of Node that expose the internal http parser and versions of Node that support FinalizationRegistry, so I think you could hardcode gc on this side.

I like this idea. Given that we are still at an early stage, as @ronag said, even just exposing the wasm module is enough for now, if that's not a problem of course. There's a lot of room for improvement. I would be OK with merging this into a custom branch if that's better.

@mcollina
Member

mcollina commented Mar 8, 2021

It really depends on what @indutny prefers :).

@dnlup
Contributor Author

dnlup commented Mar 8, 2021

It really depends on what @indutny prefers :).

Yes, of course

@indutny
Member

indutny commented Mar 9, 2021

Yay. WASM!

I think we should strive for a saner API like the one @devsnek posted, but having raw WASM support as an unstable feature is acceptable too. Not locking in a JS API means that you'd have more room to experiment with it in undici, right?

@dnlup
Contributor Author

dnlup commented Mar 9, 2021

Not locking in a JS API means that you'd have more room to experiment with it in undici, right?

Yes, exactly.

I think the JS API will define itself organically while we experiment (I would be happy to try to implement it). Defining a JS API here would also mean that we would have to install llhttp as a dependency in undici, and we would like to avoid any dependencies (formally declared in package.json) at the moment.

@ronag
Member

ronag commented Mar 9, 2021

Just for reference:

It would be nice if the wasm build lived in undici. That way we could apply undici-specific optimizations in the wasm API and reduce the number of wasm->js calls. I think we could move more of the HTTPParser logic into native/wasm land.
E.g., it might be possible to combine wasm_on_header_field and wasm_on_header_value into a single wasm_on_headers.

I.e., if we put wasm in llhttp, we might want to make some semver-major (?) changes in the future related to optimizations of the API, if that makes sense.
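One way the field/value batching could look on the JS side. This is a simplified sketch (the names are hypothetical, and chunked header *fields* are ignored for brevity):

```javascript
// Sketch: buffer on_header_field / on_header_value pairs and surface a
// single onHeaders callback at headers-complete, so the consumer sees
// one call instead of one per wasm->js crossing.
class HeaderBatcher {
  constructor(onHeaders) {
    this.headers = [];        // [name, value] pairs, in arrival order
    this.onHeaders = onHeaders;
  }

  onHeaderField(name) {
    this.headers.push([name, '']);
  }

  onHeaderValue(chunk) {
    // Values may arrive in chunks; append to the most recent field.
    this.headers[this.headers.length - 1][1] += chunk;
  }

  onHeadersComplete() {
    this.onHeaders(this.headers);
  }
}
```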

@indutny
Member

indutny commented Mar 11, 2021

So is this ready for review?

@mcollina
Member

Yes it is.

@ronag
Member

ronag commented Mar 17, 2021

@indutny any idea when you might have time to review this?

@indutny
Member

indutny commented Mar 18, 2021 via email

Member

@indutny indutny left a comment

This looks very good, but I've left a few nitpicks/requests that have to be resolved before merging. Thank you for submitting it, and sorry for the long delay on my end.

build_wasm.js
examples/wasm.js
src/native/api.c
@dnlup
Contributor Author

dnlup commented Mar 29, 2021

This looks very good, but I've left a few nitpicks/requests that have to be resolved before merging. Thank you for submitting it, and sorry for the long delay on my end.

No worries, thank you for the review. I'll work on it!

@dnlup
Contributor Author

dnlup commented Apr 2, 2021

To reiterate:

these changes add some scripts to build wasm. Nothing is exposed in the package itself. If one wanted to use the build, they would have to copy the build/wasm folder into their project and write their own parser.

I see a lot of potential with wasm. The next step would be exposing an API at the package level that lets the user install llhttp as a dependency and do something like:

import { createParser, constants } from 'llhttp';

const parser = createParser(constants.TYPE.RESPONSE, {
  // ...other options
});

As you already suggested.
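For reference, "copy the build/wasm folder and write your own parser" boils down to instantiating the artifact yourself. A hedged sketch: the llhttp.wasm path and the env import name are hypothetical, and for a self-contained demo the bytes below are just the 8-byte empty-module header rather than the real build output:

```javascript
// Sketch: instantiate a wasm binary and hand back its exports. In a
// real setup `bytes` would come from something like
// require('fs').readFileSync('build/wasm/llhttp.wasm').
async function loadParserModule(bytes) {
  const { instance } = await WebAssembly.instantiate(bytes, {
    env: {
      // A callback the llhttp build would invoke; this name is a
      // hypothetical placeholder, not the real import surface.
      wasm_on_message_begin: () => 0,
    },
  });
  return instance.exports;
}

// Smallest valid wasm module: magic number + version, no sections.
const emptyModule = new Uint8Array([0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00]);
```

Extra entries in the import object are ignored for modules that declare no imports, which is why the empty module instantiates cleanly here.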

This reduces the context size from a few MB to a few KB.
Minor cleanup in `examples/wasm.ts`.
Member

@indutny indutny left a comment

Awesome work. Thank you so much for taking the time to address my review comments!

mkdirSync(join(WASM_SRC, 'build'));
process.exit(0);
} catch (error) {
if (error.code !== 'EEXIST') {
Member

Nice touch!

@indutny indutny merged commit a620012 into nodejs:master Apr 4, 2021
@indutny
Member

indutny commented Apr 4, 2021

Released in v5.0.0
